35 research outputs found

    Global-referenced navigation grids for off-road vehicles and environments

    Full text link
    [EN] The presence of automation and information technology in agricultural environments no longer seems questionable; smart spraying, variable-rate fertilizing, and automatic guidance are becoming usual management tools in modern farms. Yet such techniques are still in their nascence and offer a lively hotbed for innovation. In particular, significant research efforts are being directed toward vehicle navigation and awareness in off-road environments. However, the majority of solutions under development are based either on occupancy grids referenced with odometry and dead-reckoning or on GPS waypoint following, but never on both. Yet navigation in off-road environments benefits greatly from both approaches: perception data effectively condensed in regular grids, and global references for every cell of the grid. This research proposes a framework to build globally referenced navigation grids by combining three-dimensional stereo vision with satellite-based global positioning. The construction process entails the in-field recording of perceptual information plus the geodetic coordinates of the vehicle at every image acquisition position, in addition to other basic data such as velocity, heading, and GPS quality indices. The creation of local grids occurs in real time right after the stereo images have been captured by the vehicle in the field, but the final assembly of universal grids takes place after finishing the acquisition phase. Vehicle-fixed individual grids are then superposed onto the global grid, transferring the original perception data to universal cells expressed in Local Tangent Plane coordinates. Global referencing allows data to be appended discontinuously, so that navigation grids can be completed and updated over time across multiple mapping sessions. This methodology was validated in a commercial vineyard, where several universal grids of the crops were generated. Vine rows were correctly reconstructed, although some difficulties appeared around the headland turns as a consequence of unreliable heading estimations. Navigation information conveyed through globally referenced regular grids turned out to be a powerful tool for upcoming practical implementations within agricultural robotics. © 2011 Elsevier B.V. All rights reserved.

    The author would like to thank Juan Jose Pena Suarez and Montano Perez Teruel for their assistance in the preparation of the prototype vehicle, Veronica Saiz Rubio for her help during most of the field experiments, Ratul Banerjee for his contribution in the development of software, and Luis Gil-Orozco Esteve for granting permission to perform multiple tests in the vineyards of his winery Finca Ardal. Gratitude is also extended to the Spanish Ministry of Science and Innovation for funding this research through project AGL2009-11731.

    Rovira Más, F. (2011). Global-referenced navigation grids for off-road vehicles and environments. Robotics and Autonomous Systems. 60(2):278-287. https://doi.org/10.1016/j.robot.2011.11.007
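    As an illustration of the grid-referencing step described above, the following minimal Python sketch converts geodetic coordinates to Local Tangent Plane (east/north) offsets with a flat-earth approximation and maps them to the indices of a regular grid. Function names, cell size, and coordinates are hypothetical examples, not the paper's implementation.

import math

EARTH_RADIUS = 6378137.0  # WGS-84 equatorial radius (m)

def geodetic_to_ltp(lat_deg, lon_deg, origin_lat_deg, origin_lon_deg):
    """Approximate east/north offsets (m) from the grid origin (flat-earth model)."""
    lat0 = math.radians(origin_lat_deg)
    east = math.radians(lon_deg - origin_lon_deg) * EARTH_RADIUS * math.cos(lat0)
    north = math.radians(lat_deg - origin_lat_deg) * EARTH_RADIUS
    return east, north

def ltp_to_cell(east, north, cell_size=0.2):
    """Map Local Tangent Plane coordinates to (column, row) indices of a regular grid."""
    return int(east // cell_size), int(north // cell_size)

# Hypothetical vehicle position relative to a grid origin in the field
e, n = geodetic_to_ltp(39.48012, -0.34021, 39.48000, -0.34000)
print(ltp_to_cell(e, n))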

    Global 3D Terrain Maps for Agricultural Applications

    Get PDF

    GPS data conditioning for enhancing reliability of automated off-road vehicles

    Full text link
    [EN] The practical implementation of precision agriculture at a large scale has not yet occurred for several reasons. Among them, the lack of uniformity and reliability in global positioning has discouraged many producers from adopting advanced solutions which, while considered to add significant value to their production systems, cannot be incorporated without guarantees of minimum levels of long-term consistency. Although substantial improvements are constantly being introduced by receiver manufacturers, positioning errors can appear at the final stages of the localization process, resulting in inaccuracies and anomalies normally undetected by embedded quality filters. This article proposes an actuation protocol to enhance the robustness of GPS information for practical agricultural applications. The algorithm embodying this strategy merges partially acquired raw strings into complete US National Marine Electronics Association (NMEA) messages whose information fields are checked for consistency. Once the data qualify as stable, other logic filters are applied to reinforce the likelihood of obtaining proper locations. Extensive field tests demonstrated that the algorithm was able to discard most erroneous positions due to typical GPS errors and poor signal reception in complex agricultural environments. However, the phenomena of coordinate quantization and random outliers were still present, which indicates that further redundancy is necessary to avoid unreliable outcomes. In this regard, positive results are anticipated from supplementary consistency checks based on GPS-derived vehicle heading and speed.

    Our gratitude is extended to the Spanish Ministry of Science and Innovation for funding this research through project AGL2009-11731.

    Rovira Más, F.; Banerjee, R. (2012). GPS data conditioning for enhancing reliability of automated off-road vehicles. Proceedings of the Institution of Mechanical Engineers, Part D: Journal of Automobile Engineering. 227(4):521-535. https://doi.org/10.1177/0954407012454976
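    The sketch below illustrates, under simple assumptions, the kind of conditioning the abstract describes: partial serial reads are assembled into complete NMEA sentences, the checksum is verified, and a GGA message is accepted only if its fix quality, satellite count, and HDOP pass basic logic filters. Thresholds and function names are illustrative, not those of the published protocol.

def assemble_sentences(buffer, chunk):
    """Append a partial serial read and split off any complete NMEA sentences."""
    buffer += chunk
    *complete, remainder = buffer.split("\r\n")
    return [s for s in complete if s], remainder

def nmea_checksum_ok(sentence):
    """Validate a sentence of the form $...*hh by XOR-ing the characters between $ and *."""
    if not sentence.startswith("$") or "*" not in sentence:
        return False
    body, _, checksum = sentence[1:].partition("*")
    calc = 0
    for ch in body:
        calc ^= ord(ch)
    return f"{calc:02X}" == checksum.strip().upper()

def accept_gga(sentence, min_quality=1, min_sats=5, max_hdop=2.0):
    """Apply simple logic filters to a GGA message before trusting its position."""
    if not nmea_checksum_ok(sentence):
        return False
    fields = sentence.split(",")
    if not fields[0].endswith("GGA") or len(fields) < 9:
        return False
    try:
        quality = int(fields[6])   # 0 = invalid, 1 = GPS, 2 = differential GPS
        num_sats = int(fields[7])
        hdop = float(fields[8])
    except ValueError:
        return False
    return quality >= min_quality and num_sats >= min_sats and hdop <= max_hdop

# Build a demonstration sentence with a self-computed checksum
body = "GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,"
cs = 0
for ch in body:
    cs ^= ord(ch)
print(accept_gga(f"${body}*{cs:02X}"))  # True: checksum, fix quality, satellites and HDOP pass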

    Sensor architecture and task classification for agricultural vehicles and environments

    Get PDF
    [EN] The long-standing wish of endowing agricultural vehicles with an increasing degree of autonomy is becoming a reality thanks to two crucial facts: the broad diffusion of global positioning satellite systems and the inexorable progress of computers and electronics. Agricultural vehicles are currently the only self-propelled ground machines commonly integrating commercial automatic navigation systems. Farm equipment manufacturers and satellite-based navigation system providers, in a joint effort, have pushed this technology to unprecedented heights; yet there are many unresolved issues and an unlimited potential still to uncover. The complexity inherent to intelligent vehicles is rooted in the selection and coordination of the optimum sensors, the computer reasoning techniques to process the acquired data, and the resulting control strategies for automatic actuators. The advantageous design of the network of onboard sensors is necessary for the future deployment of advanced agricultural vehicles. This article analyzes a variety of typical environments and situations encountered in agricultural fields, and proposes a sensor architecture especially adapted to cope with them. The proposed strategy groups sensors into four specific subsystems: global localization, feedback control and vehicle pose, non-visual monitoring, and local perception. The designed architecture responds to vital vehicle tasks classified within three layers devoted to safety, operative information, and automatic actuation. The success of this architecture, implemented and tested in various agricultural vehicles over the last decade, rests on its capacity to integrate redundancy and incorporate new technologies in a practical way.

    The research activities devoted to the study of sensor and system architectures for agricultural intelligent vehicles carried out during 2010 have been supported by the Spanish Ministry of Science and Innovation through Project AGL2009-11731.

    Rovira Más, F. (2010). Sensor architecture and task classification for agricultural vehicles and environments. Sensors. 10(12):11226-11247. https://doi.org/10.3390/s101211226
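    A minimal data-structure sketch of the proposed grouping is given below, assuming a hypothetical sensor registry: each sensor is assigned to one of the four subsystems and to the task layers it feeds. It only illustrates the classification idea, not the configuration tested on the vehicles.

from dataclasses import dataclass
from enum import Enum

class Subsystem(Enum):
    GLOBAL_LOCALIZATION = "global localization"
    FEEDBACK_AND_POSE = "feedback control and vehicle pose"
    NON_VISUAL_MONITORING = "non-visual monitoring"
    LOCAL_PERCEPTION = "local perception"

class TaskLayer(Enum):
    SAFETY = "safety"
    OPERATIVE_INFORMATION = "operative information"
    AUTOMATIC_ACTUATION = "automatic actuation"

@dataclass
class Sensor:
    name: str
    subsystem: Subsystem
    layers: tuple  # task layers this sensor feeds

# Hypothetical registry illustrating the grouping (not the tested configuration)
registry = [
    Sensor("GPS receiver", Subsystem.GLOBAL_LOCALIZATION,
           (TaskLayer.OPERATIVE_INFORMATION, TaskLayer.AUTOMATIC_ACTUATION)),
    Sensor("wheel encoder", Subsystem.FEEDBACK_AND_POSE, (TaskLayer.AUTOMATIC_ACTUATION,)),
    Sensor("engine monitor", Subsystem.NON_VISUAL_MONITORING, (TaskLayer.OPERATIVE_INFORMATION,)),
    Sensor("stereo camera", Subsystem.LOCAL_PERCEPTION,
           (TaskLayer.SAFETY, TaskLayer.AUTOMATIC_ACTUATION)),
]

for s in registry:
    print(f"{s.name}: {s.subsystem.value} -> {[layer.value for layer in s.layers]}")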

    Design parameters for adjusting the visual field of binocular stereo cameras

    Full text link
    [EN] Stereoscopic cameras are becoming fundamental sensors for providing perception capabilities to automated vehicles; however, they need to be adequately set up to avoid excessive data processing and unreliable outcomes. Combinations of baselines and lens focal lengths were optimised to adjust the field of view of a stereo camera to provide the two fundamental perceptions required for intelligent vehicles: safeguarding distances around 6 m and look-ahead distances up to 20 m for automatic guidance. The main objective was to develop a systematic procedure to find the parameters that best sense the desired field of view. Quantitative indices to estimate perceptive quality, such as relative errors and efficiencies, were defined and applied to particular cases. Experiments, both in the laboratory and outdoors, led to the conclusion that short ranges under 6 m from the vehicle were best acquired with 8 mm lenses and baselines ranging from 100 mm to 150 mm, whereas 200 mm baselines coupled with 12 mm and 8 mm lenses were more suitable for longer look-ahead distances. These experiments also proved the utility of the proposed methodology. © 2009 IAgrE. Published by Elsevier Ltd. All rights reserved.

    The material presented in this paper was based upon work supported partially by the Ministry of Education and Science Funds, Spain (AGL2006-09656/AGR), the United States Department of Agriculture (USDA) Hatch Funds (ILLU-10-352 AE) and Bruce Cowgur Mid-Tech Memorial Funds. Any opinions, findings, and conclusions expressed in this publication are those of the authors and do not necessarily reflect the views of the University of Illinois, USA; the Ministry of Education and Science, Spain; the USDA, USA; and Midwest Technologies Inc., USA.

    Rovira Más, F. (2010). Design parameters for adjusting the visual field of binocular stereo cameras. Biosystems Engineering. 105(1):59-70. https://doi.org/10.1016/j.biosystemseng.2009.09.013
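    The trade-off between baseline and focal length can be pictured with the standard stereo relations Z = f·B/d and ΔZ ≈ Z²·Δd/(f·B). The sketch below assumes a hypothetical 7.4 µm pixel pitch to express the focal length in pixels; it is only an order-of-magnitude illustration, not the quantitative indices defined in the paper.

def stereo_range(focal_mm, baseline_mm, disparity_px, pixel_um=7.4):
    """Range (m) from Z = f*B/d, with the focal length expressed in pixels."""
    f_px = focal_mm * 1000.0 / pixel_um
    return f_px * (baseline_mm / 1000.0) / disparity_px

def range_resolution(focal_mm, baseline_mm, z_m, disparity_step_px=1.0, pixel_um=7.4):
    """Depth increment (m) caused by a one-pixel disparity change at range z_m."""
    f_px = focal_mm * 1000.0 / pixel_um
    return (z_m ** 2) * disparity_step_px / (f_px * (baseline_mm / 1000.0))

# Roughly compare a short-range and a look-ahead configuration from the abstract
print(stereo_range(8, 150, 25))       # range seen at a 25-pixel disparity
print(range_resolution(8, 150, 6))    # 8 mm lens, 150 mm baseline, evaluated at 6 m
print(range_resolution(12, 200, 20))  # 12 mm lens, 200 mm baseline, evaluated at 20 m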

    Proximal sensing mapping method to generate field maps in vineyards

    Get PDF
    [EN] An innovative methodology to generate vegetative vigor maps in vineyards (Vitis vinifera L.) has been developed and pre-validated. The architecture proposed implements a Global Positioning System (GPS) receiver and a computer vision unit comprising a monocular charge-coupled device (CCD) camera equipped with an 8-mm lens and a pass-band near-infrared (NIR) filter. Both sensors are mounted on a medium-size conventional agricultural tractor. The synchronization of perception (camera) and localization (GPS) sensors allowed the creation of globally-referenced regular grids, denominated universal grids, whose cells were filled with the estimated vegetative vigor of the monitored vines. Vine vigor was quantified as the relative percentage of vegetation automatically estimated by the onboard algorithm through the images captured with the camera. Validation tests compared spatial differences in vine vigor with yield differentials along the rows. The positive correlation between vigor and yield variations showed the potential of proximal sensing and the advantages of acquiring top-view images from conventional vehicles.

    Sáiz Rubio, V.; Rovira Más, F. (2013). Proximal sensing mapping method to generate field maps in vineyards. Agricultural Engineering International: CIGR Journal. 15(2):47-59. http://hdl.handle.net/10251/102750
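    A minimal sketch of the vigor estimation and grid filling is shown below: the vegetation percentage is approximated by thresholding an 8-bit NIR image, and estimates falling into the same universal cell are averaged. The threshold, grid size, and function names are assumptions made for illustration, not the onboard algorithm.

import numpy as np

def vegetation_fraction(nir_image, threshold=120):
    """Relative vegetation cover (%) estimated by thresholding an 8-bit NIR image."""
    return 100.0 * np.count_nonzero(nir_image > threshold) / nir_image.size

def update_vigor_grid(grid, counts, col, row, vigor):
    """Keep a running average of the vigor estimates assigned to the same universal cell."""
    counts[row, col] += 1
    grid[row, col] += (vigor - grid[row, col]) / counts[row, col]

# Toy example: one synthetic NIR frame assigned to cell (col=2, row=5)
grid = np.zeros((100, 100))
counts = np.zeros((100, 100), dtype=int)
frame = np.random.randint(0, 256, size=(240, 320), dtype=np.uint8)
update_vigor_grid(grid, counts, col=2, row=5, vigor=vegetation_fraction(frame))
print(grid[5, 2])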

    From Smart Farming towards Agriculture 5.0: A Review on Crop Data Management

    Full text link
    [EN] The information that crops offer is turned into profitable decisions only when efficiently managed. Current advances in data management are making Smart Farming grow exponentially, as data have become the key element in modern agriculture to help producers with critical decision-making. Valuable advantages appear with objective information acquired through sensors with the aim of maximizing productivity and sustainability. Such data-driven farms rely on information that can increase efficiency by avoiding the misuse of resources and the pollution of the environment. Data-driven agriculture, with the help of robotic solutions incorporating artificial intelligence techniques, sets the grounds for the sustainable agriculture of the future. This paper reviews the current status of advanced farm management systems by revisiting each crucial step, from data acquisition in crop fields to variable rate applications, so that growers can make optimized decisions to save money while protecting the environment and transforming how food will be produced to sustainably match the forthcoming population growth.

    This research article is part of a project that has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement No 737669.

    Sáiz Rubio, V.; Rovira Más, F. (2020). From Smart Farming towards Agriculture 5.0: A Review on Crop Data Management. Agronomy. 10(2):1-21. https://doi.org/10.3390/agronomy10020207

    Bifocal Stereoscopic Vision for Intelligent Vehicles

    Get PDF
    The numerous benefits of real-time 3D awareness for autonomous vehicles have motivated the incorporation of stereo cameras into the perception units of intelligent vehicles. The availability of the distance between camera and objects is essential for such applications as automatic guidance and safeguarding; however, a poor estimation of the position of the objects in front of the vehicle can result in dangerous actions. There is an emphasis, therefore, on the design of perception engines that can make available a rich and reliable interval of ranges in front of the camera. The objective of this research is to develop a stereo head that is capable of capturing 3D information from two cameras simultaneously, sensing different, but complementary, fields of view. To do so, the concept of bifocal perception was defined and physically materialized in an experimental bifocal stereo camera. The assembled system was validated through field tests, and results showed that each stereo pair of the head excelled at a singular range interval. The fusion of both intervals led to a more faithful representation of reality.
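    A simple way to picture the fusion of the two stereo pairs is to keep each 3D point from the pair whose preferred range interval covers it, as in the sketch below. The 6 m split and the point format are illustrative assumptions, not the validated system.

def fuse_bifocal(points_short, points_long, short_limit_m=6.0):
    """Merge two 3D point sets, trusting each stereo pair only within its best range interval.

    points_*: iterables of (x, y, z) tuples in metres, with z the range from the camera.
    """
    fused = [p for p in points_short if p[2] <= short_limit_m]
    fused += [p for p in points_long if p[2] > short_limit_m]
    return fused

near_pair = [(0.5, 0.1, 3.2), (1.0, 0.0, 7.5)]   # second point lies beyond the short interval
far_pair = [(1.1, 0.2, 7.4), (2.0, 0.1, 15.8)]
print(fuse_bifocal(near_pair, far_pair))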

    Augmented Perception for Agricultural Robots Navigation

    Full text link
    [EN] Producing food in a sustainable way is becoming very challenging today due to the lack of skilled labor, the unaffordable costs of labor when available, and the limited returns for growers as a result of the low produce prices demanded by big supermarket chains, in contrast to the ever-increasing costs of inputs such as fuel, chemicals, seeds, or water. Robotics emerges as a technological advance that can counterweight some of these challenges, mainly in industrialized countries. However, the deployment of autonomous machines in open environments exposed to uncertainty and harsh ambient conditions poses an important challenge to reliability and safety. Consequently, a deep parametrization of the working environment in real time is necessary to achieve autonomous navigation. This article proposes a navigation strategy for guiding a robot along vineyard rows for field monitoring. Given that global positioning cannot be guaranteed permanently in any vineyard, the strategy is based on local perception and results from fusing three complementary technologies: 3D vision, lidar, and ultrasonics. Several perception-based navigation algorithms were developed between 2015 and 2019. After their comparison in real environments and conditions, results showed that the augmented perception derived from combining these three technologies provides a consistent basis for outlining the intelligent behavior of agricultural robots operating within orchards.

    This work was supported by the European Union Research and Innovation Programs under Grant N. 737669 and Grant N. 610953. The associate editor coordinating the review of this article and approving it for publication was Dr. Oleg Sergiyenko.

    Rovira Más, F.; Sáiz Rubio, V.; Cuenca-Cuenca, A. (2021). Augmented Perception for Agricultural Robots Navigation. IEEE Sensors Journal. 21(10):11712-11727. https://doi.org/10.1109/JSEN.2020.3016081
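    The sketch below shows one plausible, simplified fusion of the three local-perception sources for row following: a weighted average of lateral-offset estimates that renormalizes the weights when a sensor drops out. The weights and interface are hypothetical; the article compares several algorithms rather than prescribing this one.

def fuse_lateral_offset(stereo_m, lidar_m, ultrasonic_m, weights=(0.5, 0.3, 0.2)):
    """Weighted estimate of the robot's lateral offset within the row (m).

    Unavailable readings (None) are skipped and the remaining weights renormalized,
    so guidance degrades gracefully instead of failing when one sensor drops out.
    """
    pairs = [(r, w) for r, w in zip((stereo_m, lidar_m, ultrasonic_m), weights)
             if r is not None]
    if not pairs:
        return None  # no local perception available: the caller should stop the robot
    total_w = sum(w for _, w in pairs)
    return sum(r * w for r, w in pairs) / total_w

print(fuse_lateral_offset(0.12, 0.18, None))  # stereo and lidar only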

    Sensing Architecture for Terrestrial Crop Monitoring: Harvesting Data as an Asset

    Full text link
    [EN] Very often, the root of the problems found in producing food sustainably, as well as the origin of many environmental issues, derives from making decisions with unreliable or nonexistent data. Data-driven agriculture has emerged as a way to palliate the lack of meaningful information when taking critical steps in the field. However, many decisive parameters still require manual measurements and proximity to the target, which results in the typical undersampling that impedes statistical significance and the application of AI techniques that rely on massive data. To invert this trend, and simultaneously combine crop proximity with massive sampling, a sensing architecture for automating crop scouting from ground vehicles is proposed. At present, there are no clear guidelines on how monitoring vehicles must be configured for optimally tracking crop parameters at high resolution. This paper structures the architecture for such vehicles into four subsystems, examines the most common components for each subsystem, and delves into their interactions for an efficient delivery of high-density field data from initial acquisition to final recommendation. Its main advantages rest on the real-time generation of crop maps that blend the global positioning of canopy locations, some of their agronomical traits, and the precise monitoring of the ambient conditions surrounding such canopies. As a use case, the envisioned architecture was embodied in an autonomous robot to automatically sort two harvesting zones of a commercial vineyard to produce two wines of dissimilar characteristics. The information contained in the maps delivered by the robot may help growers systematically apply differential harvesting, evidencing the suitability of the proposed architecture for massive monitoring and subsequent data-driven actuation. While many crop parameters still cannot be measured non-invasively, the availability of novel sensors is continually growing; to benefit from them, an efficient and trustworthy sensing architecture becomes indispensable.

    This research was funded by the European Union's Horizon 2020 research and innovation program with grant agreement number 737669, entitled VineScout: Intelligent decisions from vineyard robots.

    Rovira Más, F.; Saiz Rubio, V.; Cuenca-Cuenca, A. (2021). Sensing Architecture for Terrestrial Crop Monitoring: Harvesting Data as an Asset. Sensors. 21(9):1-24. https://doi.org/10.3390/s21093114
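    To make the map-record idea concrete, the sketch below defines a hypothetical georeferenced sample blending canopy position, one agronomical trait, and an ambient reading, and sorts samples into two harvesting zones by a vigor threshold. Field names and the threshold are illustrative assumptions, not the architecture's actual data model.

from dataclasses import dataclass

@dataclass
class MapRecord:
    """One georeferenced sample delivered by the monitoring vehicle."""
    east_m: float       # Local Tangent Plane coordinates of the canopy point
    north_m: float
    ndvi: float         # example agronomical trait assumed measurable onboard
    air_temp_c: float   # ambient condition measured next to the canopy

def harvest_zone(record, ndvi_split=0.6):
    """Assign a record to one of two harvesting zones by a vigor threshold."""
    return "A" if record.ndvi >= ndvi_split else "B"

samples = [MapRecord(10.2, 55.1, 0.72, 29.5), MapRecord(11.0, 83.4, 0.41, 30.1)]
print([harvest_zone(s) for s in samples])  # ['A', 'B']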